1.
World J Surg ; 47(10): 2340-2346, 2023 10.
Article in English | MEDLINE | ID: mdl-37389644

ABSTRACT

BACKGROUND: Accurately predicting which patients are most likely to benefit from massive transfusion protocol (MTP) activation may help patients while saving blood products and limiting cost. The purpose of this study is to explore the use of modern machine learning (ML) methods to develop and validate a model that can accurately predict the need for massive blood transfusion (MBT). METHODS: The institutional trauma registry was used to identify all trauma team activation cases between June 2015 and August 2019. We used an ML framework to explore multiple ML methods including logistic regression with forward and backward selection, logistic regression with lasso and ridge regularization, support vector machines (SVM), decision tree, random forest, naive Bayes, XGBoost, AdaBoost, and neural networks. Each model was then assessed using sensitivity, specificity, positive predictive value, and negative predictive value. Model performance was compared to that of existing scores, including the Assessment of Blood Consumption (ABC) and the Revised Assessment of Bleeding and Transfusion (RABT). RESULTS: A total of 2438 patients were included in the study, with 4.9% receiving MBT. All models besides decision tree and SVM attained an area under the curve (AUC) above 0.75 (range: 0.75-0.83). Most of the ML models had higher sensitivity (0.55-0.83) than the ABC and RABT scores (0.36 and 0.55, respectively) while maintaining comparable specificity (0.75-0.81; ABC 0.80 and RABT 0.83). CONCLUSIONS: Our ML models performed better than existing scores. Implementing an ML model on mobile computing devices or within the electronic health record has the potential to improve usability.
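The four metrics used to assess each model above can be computed directly from a confusion matrix. A minimal sketch (the counts in the example are illustrative, not the study's data):

```python
# Hedged sketch: sensitivity, specificity, positive predictive value
# (PPV), and negative predictive value (NPV) from binary predictions.
# Illustrative only; not the study's actual evaluation code.

def confusion_counts(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def diagnostic_metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true MBT cases flagged
        "specificity": tn / (tn + fp),  # fraction of non-MBT cases spared
        "ppv": tp / (tp + fp),          # precision of an MBT alert
        "npv": tn / (tn + fn),          # reliability of a negative call
    }
```

A high-sensitivity, comparable-specificity profile, as reported for the ML models versus ABC and RABT, trades a few extra false activations for fewer missed MBT patients.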


Subject(s)
Blood Transfusion , Hemorrhage , Humans , Bayes Theorem , Hemorrhage/diagnosis , Hemorrhage/etiology , Hemorrhage/therapy , Blood Transfusion/methods , Predictive Value of Tests , Machine Learning
2.
J Acquir Immune Defic Syndr ; 92(5): 370-377, 2023 04 15.
Article in English | MEDLINE | ID: mdl-36728397

ABSTRACT

BACKGROUND: In response to the COVID-19 pandemic, San Francisco County (SFC) had to shift many nonemergency health care resources to COVID-19, reducing HIV control resources. We sought to quantify COVID-19 effects on HIV burden among men who have sex with men (MSM) as SFC returns to pre-COVID service levels and progresses toward the Ending the HIV Epidemic (EHE) goals. SETTING: Microsimulation model of MSM in SFC tracking HIV progression and treatment. METHODS: Scenario analysis where services affected by COVID-19 [testing, care engagement, pre-exposure prophylaxis (PrEP) uptake, and retention] return to pre-COVID levels by the end of 2022 or 2025, compared against a counterfactual where COVID-19 changes never occurred. We also examined scenarios where resources are prioritized to reach new patients or retain existing patients from 2023 to 2025 before all services return to pre-COVID levels. RESULTS: The annual numbers of MSM prescribed PrEP, newly acquiring HIV, newly diagnosed, and achieving viral load suppression (VLS) rebound quickly after HIV care returns to pre-COVID levels. However, COVID-19 service disruptions result in measurable reductions in cumulative PrEP use and VLS person-years, and increases in incidence and deaths over the 2020-2035 period. The burden is statistically significantly larger if these effects end in 2025 instead of 2022. Prioritizing HIV care/prevention initiation over retention results in more person-years of PrEP but fewer VLS person-years and more deaths, influencing EHE PrEP outcomes. CONCLUSIONS: Earlier HIV care return to pre-COVID levels results in lower cumulative HIV burdens. Resource prioritization decisions may differentially affect different EHE goals.


Subject(s)
COVID-19 , HIV Infections , Pre-Exposure Prophylaxis , Sexual and Gender Minorities , Male , Humans , Homosexuality, Male , HIV Infections/drug therapy , HIV Infections/epidemiology , HIV Infections/prevention & control , San Francisco/epidemiology , Pandemics , COVID-19/epidemiology , Pre-Exposure Prophylaxis/methods
3.
Article in English | MEDLINE | ID: mdl-38550611

ABSTRACT

The ubiquity of missing values in real-world datasets poses a challenge for statistical inference and can prevent similar datasets from being analyzed in the same study, precluding many existing datasets from being used for new analyses. While an extensive collection of packages and algorithms has been developed for data imputation, the overwhelming majority perform poorly if there are many missing values and small sample sizes, which are unfortunately common characteristics in empirical data. Such low-accuracy estimations adversely affect the performance of downstream statistical models. We develop a statistical inference framework for regression and classification in the presence of missing data without imputation. Our framework, RIFLE (Robust InFerence via Low-order moment Estimations), estimates low-order moments of the underlying data distribution with corresponding confidence intervals to learn a distributionally robust model. We specialize our framework to linear regression and normal discriminant analysis, and we provide convergence and performance guarantees. This framework can also be adapted to impute missing data. In numerical experiments, we compare RIFLE to several state-of-the-art approaches (including MICE, Amelia, MissForest, KNN-imputer, MIDA, and Mean Imputer) for imputation and inference in the presence of missing values. Our experiments demonstrate that RIFLE outperforms other benchmark algorithms when the percentage of missing values is high and/or when the number of data points is relatively small. RIFLE is publicly available at https://github.com/optimization-for-data-driven-science/RIFLE.
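The core idea of inference without imputation can be sketched with the simplest case: estimating first- and second-order moments from only the entries that are actually observed. This sketch omits RIFLE's confidence intervals and distributionally robust optimization layer; `None` marks a missing value:

```python
# Hedged sketch: low-order moment estimation under missingness,
# using only observed entries (pairwise-available estimation).
# RIFLE's actual estimator adds confidence intervals and a robust
# optimization layer not reproduced here.

def available_mean(column):
    """Mean over the observed entries of one variable."""
    vals = [v for v in column if v is not None]
    return sum(vals) / len(vals)

def pairwise_covariance(x, y):
    """Covariance over rows where BOTH variables are observed."""
    pairs = [(a, b) for a, b in zip(x, y) if a is not None and b is not None]
    mx = sum(a for a, _ in pairs) / len(pairs)
    my = sum(b for _, b in pairs) / len(pairs)
    return sum((a - mx) * (b - my) for a, b in pairs) / len(pairs)
```

When missingness is heavy, these pairwise estimates are noisy, which is exactly why RIFLE pairs them with confidence intervals and optimizes against the worst case within them.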

4.
Kidney Med ; 4(12): 100563, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36479469

ABSTRACT

Rationale & Objective: Patients with a high-risk Apolipoprotein L1 (APOL1) genotype are more likely to develop chronic kidney disease and kidney failure. It is unclear whether this increased risk is entirely mediated by the development of proteinuria. Study Design: Retrospective observational study of the African American Study of Kidney Disease and Hypertension cohort and Chronic Renal Insufficiency Cohort. Exposures & Predictors: Self-identified race (Black/non-Black) and presence of high-risk APOL1 genotype. The primary model was adjusted for age, sex, diabetes, estimated glomerular filtration rate, and urinary protein-creatinine ratio. Outcomes: Time to kidney failure defined as time to dialysis or transplantation. Analytical Approach: We used Cox proportional hazard models to study how proteinuria mediates the association between APOL1 and kidney failure. We modeled proteinuria at baseline and as a time-varying covariate. Results: A high-risk APOL1 genotype was associated with a significantly higher risk of kidney failure, even for patients with minimal proteinuria (HR, 1.87; 95% CI, 1.23-2.84). The association was not significant among patients with high proteinuria (HR, 1.22; 95% CI, 0.93-1.61). When modeling proteinuria as a time-varying covariate, a high-risk APOL1 genotype was associated with higher kidney failure risk even among patients who never developed proteinuria (HR, 2.04; 95% CI, 1.10-3.77). Compared to non-Black patients, Black patients without the high-risk genotype did not have higher risk of kidney failure (HR, 0.96; 95% CI, 0.85-1.10). Limitations: Two datasets were combined to increase statistical power. Limited generalizability beyond the study cohorts. Residual confounding common to observational studies. Conclusions: A high-risk APOL1 genotype is significantly associated with increased kidney failure risk, especially among patients without baseline proteinuria. 
Although our results suggest that the risk is partially mediated through proteinuria, higher kidney failure risk was present even among patients who never developed proteinuria. Providers should consider screening for the high-risk APOL1 genotype in populations with chronic kidney disease, especially among Black patients without proteinuria.
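Modeling proteinuria as a time-varying covariate, as described above, requires restructuring each patient's follow-up into start/stop intervals so the covariate can switch on at onset. A minimal sketch of that data restructuring (field names are illustrative; the study's actual datasets and Cox model, fit with standard survival software, differ):

```python
# Hedged sketch: counting-process (start/stop) expansion of one
# patient's follow-up at proteinuria onset, the data format needed
# for a time-varying covariate in a Cox proportional hazards fit.
# Field names and layout are hypothetical illustrations.

def expand_time_varying(patient):
    """Return (start, stop, proteinuria, event) rows for one patient.

    patient: dict with total follow-up time, kidney-failure event flag,
    and proteinuria onset time (None if proteinuria never develops).
    """
    onset = patient["onset"]
    end = patient["followup"]
    event = patient["event"]
    if onset is None or onset >= end:
        # Never exposed during follow-up: a single unexposed interval.
        return [(0.0, end, 0, event)]
    return [
        (0.0, onset, 0, 0),    # pre-onset interval, no event yet
        (onset, end, 1, event) # post-onset interval carries the event flag
    ]
```

Each patient contributes one row per covariate regime, which is how the analysis can ask whether APOL1 risk persists in person-time with no proteinuria.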

5.
Am J Drug Alcohol Abuse ; 48(5): 618-628, 2022 09 03.
Article in English | MEDLINE | ID: mdl-36194086

ABSTRACT

Background: Most research on opioid misuse focuses on younger adults, yet opioid-related mortality has risen fastest among older Americans over age 55. Objectives: To assess whether there are differential patterns of opioid misuse over time between younger and older adults and whether South Carolina's mandatory Prescription Drug Monitoring Program (PDMP) affected opioid misuse differentially between the two groups. Methods: We used South Carolina's Reporting and Identification Prescription Tracking System from 2010 to 2018 to calculate an opioid misuse score for 193,073 patients (sex unknown) using days' supply, morphine milligram equivalents (MME), and the numbers of unique prescribers and dispensaries. Multivariable regression was used to assess differential opioid misuse patterns by age group over time and in response to implementation of South Carolina's mandatory PDMP in 2017. Results: We found that between 2011 and 2018, older adults received 57% (p < .01) more in total MME and 25.4 days more (p < .01) in supply, but received prescriptions from fewer doctors (-0.063 doctors, p < .01) and pharmacies (-0.11 pharmacies, p < .01) per year versus younger adults. However, older adults had lower odds of receiving a high misuse score (OR 0.88, p < .01). After the 2017 legislation, misuse scores fell among younger adults (OR 0.79, p < .01) relative to 2011, but not among older adults. Conclusion: Older adults may misuse opioids differently compared to younger adults. Assessment of policies to reduce opioid misuse should take into account subgroup differences that may be masked at the population level.
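A composite misuse score built from the four inputs named above (days' supply, total MME, unique prescribers, unique dispensaries) might be sketched as a simple flag count. The thresholds and equal weights here are hypothetical placeholders, not the study's actual scoring rule:

```python
# Hedged sketch of a composite opioid misuse score. The cutoffs in
# `thresholds` are hypothetical illustrations only; the study's real
# scoring rule is not reproduced here.

def misuse_score(days_supply, total_mme, n_prescribers, n_pharmacies,
                 thresholds=(90, 2700, 3, 2)):
    """Count how many utilization measures exceed their cutoff (0-4)."""
    flags = [
        days_supply > thresholds[0],    # long cumulative supply
        total_mme > thresholds[1],      # high total morphine equivalents
        n_prescribers > thresholds[2],  # many unique prescribers
        n_pharmacies > thresholds[3],   # many unique dispensaries
    ]
    return sum(flags)
```

A flag-count design makes the older-adult pattern in the results legible: high MME and supply can coexist with a low score if prescriber and pharmacy counts stay low.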


Subject(s)
Opioid-Related Disorders , Prescription Drug Misuse , Prescription Drug Monitoring Programs , Aged , Analgesics, Opioid/therapeutic use , Endrin/analogs & derivatives , Humans , Infant , Morphine Derivatives , Opioid-Related Disorders/drug therapy , Opioid-Related Disorders/epidemiology , Practice Patterns, Physicians' , South Carolina/epidemiology , United States
6.
AIDS Patient Care STDS ; 36(8): 300-312, 2022 08.
Article in English | MEDLINE | ID: mdl-35951446

ABSTRACT

Racial and ethnic minority men who have sex with men (MSM) are disproportionately affected by HIV/AIDS in Los Angeles County (LAC), an important epicenter in the battle to end HIV. We examine tradeoffs between effectiveness and equality of pre-exposure prophylaxis (PrEP) allocation strategies among different racial and ethnic groups of MSM in LAC and provide a framework for quantitatively evaluating disparities in HIV outcomes. To do this, we developed a microsimulation model of HIV among MSM in LAC using county epidemic surveillance and survey data to capture demographic trends and subgroup-specific partnership patterns, disease progression, patterns of PrEP use, and patterns for viral suppression. We limit analysis to MSM, who bear most of the burden of HIV/AIDS in LAC. We simulated interventions where 3000, 6000, or 9000 PrEP prescriptions are provided annually in addition to current levels, following different allocation scenarios to each racial/ethnic group (Black, Hispanic, or White). We estimated cumulative infections averted and measures of equality, after 15 years (2021-2035), relative to base case (no intervention). By comparing allocation strategies on the health equality impact plane, we find that, of the policies evaluated, targeting PrEP preferentially to Black individuals would result in the largest reductions in incidence and disparities across the equality measures we considered. This result was consistent over a range of PrEP coverage levels, demonstrating that there are "win-win" PrEP allocation strategies that do not require a tradeoff between equality and efficiency.


Subject(s)
Anti-HIV Agents , HIV Infections , Pre-Exposure Prophylaxis , Sexual and Gender Minorities , Anti-HIV Agents/therapeutic use , Ethnicity , HIV Infections/epidemiology , HIV Infections/prevention & control , Homosexuality, Male , Humans , Los Angeles/epidemiology , Male , Minority Groups , Policy
7.
J Acquir Immune Defic Syndr ; 90(S1): S167-S176, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35703769

ABSTRACT

BACKGROUND: Pre-exposure prophylaxis (PrEP) is essential to ending HIV. Yet, uptake remains uneven across racial and ethnic groups. We aimed to estimate the impacts of alternative PrEP implementation strategies in Los Angeles County. SETTING: Men who have sex with men, residing in Los Angeles County. METHODS: We developed a microsimulation model of HIV transmission, with inputs from key local stakeholders. With this model, we estimated the 15-year (2021-2035) health and racial and ethnic equity impacts of 3 PrEP implementation strategies involving coverage with 9000 additional PrEP units annually, above the Status-quo coverage level. Strategies included PrEP allocation equally (strategy 1), proportionally to HIV prevalence (strategy 2), and proportionally to HIV diagnosis rates (strategy 3), across racial and ethnic groups. We measured the degree of relative equalities in the distribution of the health impacts using the Gini index (G) which ranges from 0 (perfect equality, with all individuals across all groups receiving equal health benefits) to 1 (total inequality). RESULTS: HIV prevalence was 21.3% in 2021 [Black (BMSM), 31.1%; Latino (LMSM), 18.3%, and White (WMSM), 20.7%] with relatively equal to reasonable distribution across groups (G, 0.28; 95% confidence interval [CI], 0.26 to 0.34). During 2021-2035, cumulative incident infections were highest under Status-quo (n = 24,584) and lowest under strategy 3 (n = 22,080). Status-quo infection risk declined over time among all groups but remained higher in 2035 for BMSM (incidence rate ratio, 4.76; 95% CI: 4.58 to 4.95), and LMSM (incidence rate ratio, 1.74; 95% CI: 1.69 to 1.80), with the health benefits equally to reasonably distributed across groups (G, 0.32; 95% CI: 0.28 to 0.35). Relative to Status-quo, all other strategies reduced BMSM-WMSM and BMSM-LMSM disparities, but none reduced LMSM-WMSM disparities by 2035. 
Compared with the Status quo, strategy 3 produced the largest reductions in both incident infections (% infections averted: overall, 10.2%; BMSM, 32.4%; LMSM, 3.8%; WMSM, 3.5%) and HIV racial inequalities (G reduction, 0.08; 95% CI: 0.02 to 0.14). CONCLUSIONS: Microsimulation models developed with early, continuous stakeholder engagement and inputs yield powerful tools to guide policy implementation.
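The Gini index (G) used above to summarize how equally health benefits are distributed across racial and ethnic groups can be computed as the mean absolute difference between all pairs of group values, normalized by twice the mean. A minimal sketch with illustrative inputs:

```python
# Hedged sketch: Gini index over group-level health benefits,
# 0 = perfect equality across groups, 1 = total inequality.
# Input values are illustrative, not the study's estimates.

def gini(values):
    """Mean absolute pairwise difference, normalized by 2 * mean."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(a - b) for a in values for b in values) / (n * n)
    return mad / (2 * mean)
```

Reading a strategy's G alongside its total infections averted is what the paper's equality measures make possible: strategy 3 lowers both, so no equality-efficiency tradeoff is forced.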


Subject(s)
Anti-HIV Agents , HIV Infections , Pre-Exposure Prophylaxis , Sexual and Gender Minorities , Anti-HIV Agents/therapeutic use , HIV Infections/drug therapy , HIV Infections/epidemiology , HIV Infections/prevention & control , Homosexuality, Male , Humans , Los Angeles/epidemiology , Male
8.
BMC Urol ; 22(1): 76, 2022 May 13.
Article in English | MEDLINE | ID: mdl-35550071

ABSTRACT

BACKGROUND: To assess the price range in which fexapotide triflutate (FT), a novel injectable, is cost-effective relative to current oral pharmacotherapy (5 α-reductase inhibitor, α-blocker, 5 α-reductase inhibitor and α-blocker combination therapy) as initial therapy followed by surgery for moderate-to-severe benign prostate hyperplasia patients with lower urinary tract symptoms (BPH-LUTS). METHODS: We developed a microsimulation decision-analytic model to track the progression of BPH-LUTS and associated costs and quality-adjusted life years in the target population. The cost-effectiveness analysis was performed from Medicare's perspective with a time horizon of 4 years using 2019 US dollars for all costs. The microsimulation model considered treatment patterns associated with nonadherence to oral medication and progression to surgery. Model parameters were estimated from large randomized controlled trials, literature and expert opinion. For each initial treatment option, simulations were performed with 1000 iterations, with 1000 patients per iteration. RESULTS: Three upfront oral pharmacotherapy options are close in cost-effectiveness, with combination therapy being the most cost-effective option. Relative to upfront oral pharmacotherapy options, FT slightly increases quality-adjusted life years (QALY) per patient (1.870 (95% CI, 1.868 to 1.872) vs. 1.957 (95% CI, 1.955 to 1.959) QALYs). Under the willingness-to-pay (WTP) threshold of $150,000 per QALY, at price per injection of $14,000, FT is about as cost-effective as upfront oral pharmacotherapy options with net monetary benefit (NMB) $279,168.54. Under the WTP threshold of $50,000 per QALY, at price per injection of $5,000, FT is about as cost-effective as upfront oral pharmacotherapy options with NMB $92,135.18. In an alternative 10-year time horizon scenario, FT price per injection at $11,000 and $4,500 makes FT as cost-effective as oral pharmacotherapies. 
One-way sensitivity analysis showed that this result is most sensitive to upfront therapy prices, FT efficacy, and initial IPSS. At prices per injection of $5,000, $10,000, and $15,000, the probability that FT is either cost-effective or dominant compared to upfront oral pharmacotherapy options under a WTP threshold of $150,000 per QALY is 100%, 93%, and 40%, respectively. CONCLUSIONS: Compared to upfront oral pharmacotherapy options, FT would be cost-effective at a price per injection below $14,000, assuming a WTP threshold of $150,000 per QALY.
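The net monetary benefit (NMB) figures quoted above follow the standard definition NMB = WTP × QALYs − cost; at a given willingness-to-pay, the option with the higher NMB is preferred. A minimal sketch (inputs in the example are illustrative):

```python
# Hedged sketch: net monetary benefit comparison of treatment options
# at a willingness-to-pay (WTP) threshold. Example inputs are
# illustrative, not the study's model outputs.

def net_monetary_benefit(qalys, cost, wtp):
    """NMB = WTP * QALYs - cost, in the same currency as cost."""
    return wtp * qalys - cost

def preferred(options, wtp):
    """options: {name: (qalys, cost)}; return the name maximizing NMB."""
    return max(options, key=lambda k: net_monetary_benefit(*options[k], wtp))
```

Because NMB scales QALY gains by the WTP threshold, the price at which FT breaks even naturally shifts with the threshold, which is why the $150,000 and $50,000 thresholds yield different cost-effective injection prices.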


Subject(s)
Prostatic Hyperplasia , Aged , Cholestenone 5 alpha-Reductase , Cost-Benefit Analysis , Fluoroacetates , Humans , Hyperplasia , Male , Medicare , Peptides , Prostate , Prostatic Hyperplasia/surgery , United States
9.
Health Care Manag Sci ; 25(1): 1-23, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34357488

ABSTRACT

There is strong evidence that diabetes is underdiagnosed in the US: the Centers for Disease Control and Prevention (CDC) estimates that approximately 25% of diabetic patients are unaware of their condition. To encourage timely diagnosis of at-risk patients, we develop screening guidelines stratified by body mass index (BMI), age, and prior test history by using a Partially Observed Markov Decision Process (POMDP) framework to provide more personalized screening frequency recommendations. We identify structural results that prove the existence of threshold solutions in our problem and allow us to determine the relative timing and frequency of screening given different risk profiles. We then use nationally representative empirical data to identify a policy that provides the optimal action (screen or wait) every six months from age 45 to 90. We find that the current screening guidelines are suboptimal, and the recommended diabetes screening policy should be stratified by age and by finer BMI thresholds than in the status quo. We identify age ranges and BMI categories for which relatively less or more screening is needed compared to the existing guidelines to help physicians target patients most at risk. Compared to the status quo, we estimate that an optimal screening policy would generate higher net monetary benefits by $3,200-$3,570 and save $120-$1,290 in health expenditures per individual in the US above age 45.
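At the heart of a POMDP screening policy like the one described is a Bayesian belief update: after each negative screen, the probability that the patient has undiagnosed diabetes is revised down using the test's sensitivity and specificity, then drifts back up with risk over time, and a threshold rule triggers the next screen. A minimal sketch of the update step (all numbers are illustrative, not the paper's calibrated parameters):

```python
# Hedged sketch: posterior belief of undiagnosed disease after a
# negative screening test, the core update inside a POMDP screening
# policy. Parameter values in the example are illustrative.

def belief_after_negative(prior, sensitivity, specificity):
    """Bayes' rule: P(sick | negative test result)."""
    p_neg_given_sick = 1 - sensitivity
    p_neg = prior * p_neg_given_sick + (1 - prior) * specificity
    return prior * p_neg_given_sick / p_neg

def should_screen(belief, threshold=0.1):
    """Threshold policy: screen when the belief exceeds the cutoff."""
    return belief >= threshold
```

The paper's structural results show that such threshold policies exist and how the cutoff shifts with BMI, age, and test history; the sketch only illustrates the mechanism.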


Subject(s)
Diabetes Mellitus , Mass Screening , Aged , Aged, 80 and over , Diabetes Mellitus/diagnosis , Humans , Markov Chains , Middle Aged
10.
JHEP Rep ; 3(6): 100367, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34825154

ABSTRACT

BACKGROUND & AIMS: Uncertainties exist surrounding the timing of liver transplantation (LT) among patients with acute-on-chronic liver failure grade 3 (ACLF-3), regarding whether to accept a marginal quality donor organ to allow for earlier LT or wait for either an optimal organ offer or improvement in the number of organ failures, in order to increase post-LT survival. METHODS: We created a Markov decision process model to determine the optimal timing of LT among patients with ACLF-3 within 7 days of listing, to maximize overall 1-year survival probability. RESULTS: We analyzed 6 groups of candidates with ACLF-3: patients age ≤60 or >60 years, patients with 3 organ failures alone or 4-6 organ failures, and hepatic or extrahepatic ACLF-3. Among all groups, LT yielded significantly greater overall survival probability vs. remaining on the waiting list for even 1 additional day (p <0.001), regardless of organ quality. Creation of 2-way sensitivity analyses, with variation in the probability of receiving an optimal organ and expected post-transplant mortality, indicated that overall survival is maximized by earlier LT, particularly among candidates >60 years old or with 4-6 organ failures. The probability of improvement from ACLF-3 to ACLF-2 does not influence these recommendations, as the likelihood of organ recovery was less than 10%. CONCLUSION: During the first week after listing for patients with ACLF-3, earlier LT in general is favored over waiting for an optimal quality donor organ or for recovery of organ failures, with the understanding that the analysis is limited to consideration of only these 3 variables. LAY SUMMARY: In the setting of grade 3 acute-on-chronic liver failure (ACLF-3), questions remain regarding the timing of transplantation in terms of whether to proceed with liver transplantation with a marginal donor organ or to wait for an optimal liver, and whether to transplant a patient with ACLF-3 or wait until improvement to ACLF-2. 
In this study, we used a Markov decision process model to demonstrate that earlier transplantation of patients listed with ACLF-3 maximizes overall survival, as opposed to waiting for an optimal donor organ or for improvement in the number of organ failures.

11.
PLoS One ; 16(7): e0254950, 2021.
Article in English | MEDLINE | ID: mdl-34288951

ABSTRACT

BACKGROUND: Tuberculosis (TB) incidence in Los Angeles County, California, USA (5.7 per 100,000) is significantly higher than the U.S. national average (2.9 per 100,000). Directly observed therapy (DOT) is the preferred strategy for active TB treatment but requires substantial resources. We partnered with the Los Angeles County Department of Public Health (LACDPH) to evaluate the cost-effectiveness of AiCure, an artificial intelligence (AI) platform that allows for automated treatment monitoring. METHODS: We used a Markov model to compare DOT versus AiCure for active TB treatment in LA County. Each cohort transitioned between health states at rates estimated using data from a pilot study for AiCure (N = 43) and comparable historical controls for DOT (N = 71). We estimated total costs (2017, USD) and quality-adjusted life years (QALYs) over a 16-month horizon to calculate the incremental cost-effectiveness ratio (ICER) and net monetary benefits (NMB) of AiCure. To assess robustness, we conducted deterministic (DSA) and probabilistic sensitivity analyses (PSA). RESULTS: For the average patient, AiCure was dominant over DOT. DOT treatment cost $4,894 and generated 1.03 QALYs over 16-months. AiCure treatment cost $2,668 for 1.05 QALYs. At willingness-to-pay threshold of $150K/QALY, incremental NMB per-patient under AiCure was $4,973. In univariate DSA, NMB were most sensitive to monthly doses and vocational nurse wage; however, AiCure remained dominant. In PSA, AiCure was dominant in 93.5% of 10,000 simulations (cost-effective in 96.4%). CONCLUSIONS: AiCure for treatment of active TB is cost-effective for patients in LA County, California. Increased use of AI platforms in other jurisdictions could facilitate the CDC's vision of TB elimination.
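"Dominant" in the results above has a precise meaning: AiCure was both cheaper and more effective than DOT, so no incremental cost-effectiveness ratio (ICER) needs to be reported. A minimal sketch of that decision logic:

```python
# Hedged sketch: ICER and dominance classification for comparing two
# strategies in a cost-effectiveness analysis. A strategy "dominates"
# when it costs less and yields at least as many QALYs.

def compare(cost_new, qaly_new, cost_old, qaly_old, wtp=150_000):
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant"          # cheaper and no worse: no ICER needed
    if d_cost >= 0 and d_qaly <= 0:
        return "dominated"         # costlier and no better
    icer = d_cost / d_qaly         # $ per additional QALY
    return "cost-effective" if icer <= wtp else "not cost-effective"
```

Plugging in the per-patient figures reported above ($2,668 for 1.05 QALYs versus $4,894 for 1.03 QALYs) reproduces the dominance finding.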


Subject(s)
Artificial Intelligence/economics , Tuberculosis/economics , Tuberculosis/therapy , Adult , Aged , California , Cost-Benefit Analysis , Female , Humans , Male , Middle Aged , Monitoring, Physiologic/economics , Pilot Projects
12.
Health Econ ; 30 Suppl 1: 30-51, 2021 11.
Article in English | MEDLINE | ID: mdl-32662080

ABSTRACT

Accurate future projections of population health are imperative to plan for the future healthcare needs of a rapidly aging population. Multistate-transition microsimulation models, such as the U.S. Future Elderly Model, address this need but require high-quality panel data for calibration. We develop an alternative method that relaxes this data requirement, using repeated cross-sectional representative surveys to estimate multistate-transition contingency tables applied to Japan's population. We calculate the birth cohort sex-specific prevalence of comorbidities using five waves of the governmental health surveys. Combining estimated comorbidity prevalence with death record information, we determine the transition probabilities of health statuses. We then construct a virtual Japanese population aged 60 and older as of 2013 and perform a microsimulation to project disease distributions to 2046. Our estimates replicate governmental projections of population pyramids and match the actual prevalence trends of comorbidities and the disease incidence rates reported in epidemiological studies in the past decade. Our future projections of cardiovascular diseases indicate lower prevalence than expected from static models, reflecting recent declining trends in disease incidence and fatality.


Subject(s)
Birth Cohort , Functional Status , Aged , Cross-Sectional Studies , Female , Forecasting , Humans , Japan/epidemiology , Male , Middle Aged
13.
Curr Alzheimer Res ; 17(9): 819-822, 2020.
Article in English | MEDLINE | ID: mdl-33272181

ABSTRACT

BACKGROUND: Recent trials suggest that disease-modifying therapy (DMT) for Alzheimer's disease may become available soon. With the expected high price and a large patient pool, the budget impact will be substantial. OBJECTIVE: We explore combinations of effectiveness and price under which a DMT is cost-effective. METHODS: We used an open-source model to conduct two-way scenario analyses from both payer and societal perspectives, varying price and treatment effect size simultaneously. The analysis generates cost-effectiveness threshold prices over a potential range of DMT effectiveness in patients aged 65+ with mild cognitive impairment due to Alzheimer's disease in the US. RESULTS: Under a willingness-to-pay threshold of $150,000 per quality-adjusted life year and assuming a 30% risk reduction relative to the standard of care, the maximum cost-effective price of a DMT per patient per year is ~$22,000 and ~$15,000 from the societal and payer perspectives, respectively. CONCLUSION: Joint variation of price and treatment effect size can help assess the cost-effectiveness of a potential Alzheimer's disease treatment.
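The threshold-price logic of such a two-way analysis can be sketched as solving NMB ≥ 0 for price: the highest annual price is the WTP-valued incremental QALY gain minus the therapy's other incremental costs. The inputs in the example are hypothetical stand-ins, not the model's calibrated outputs:

```python
# Hedged sketch: maximum cost-effective annual price of a therapy,
# from WTP * (incremental QALYs) - (price + other incremental costs) >= 0.
# Example inputs are hypothetical, not the study's model outputs.

def threshold_price(incr_qalys, incr_other_costs, wtp=150_000):
    """Highest annual price at which incremental NMB is non-negative."""
    return wtp * incr_qalys - incr_other_costs

def threshold_grid(effect_sizes, qalys_per_effect, incr_other_costs):
    """Sweep effect sizes (as in a two-way scenario analysis)."""
    return {e: threshold_price(e * qalys_per_effect, incr_other_costs)
            for e in effect_sizes}
```

Sweeping the effect-size axis this way yields the threshold-price curve the abstract describes, with a separate curve per perspective as costs differ.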


Subject(s)
Alzheimer Disease/drug therapy , Alzheimer Disease/economics , Cost-Benefit Analysis/methods , Drug Development/economics , Drug Development/methods , Quality-Adjusted Life Years , Humans , Markov Chains , Treatment Outcome
15.
Cancer Med ; 9(2): 440-446, 2020 01.
Article in English | MEDLINE | ID: mdl-31749330

ABSTRACT

BACKGROUND: Standard treatment for locally advanced esophageal cancer usually includes a combination of chemotherapy, radiation, and surgery. In squamous cell carcinoma (SCC), recent studies have indicated that esophagectomy after chemoradiation does not significantly improve survival but may reduce recurrence at the cost of treatment-related mortality. This study aims to evaluate the cost-effectiveness of chemoradiation with and without esophagectomy. METHODS: We developed a decision tree and Markov model to compare chemoradiation therapy alone (CRT) versus chemoradiation plus surgery (CRT+S) in a cohort of 57-year-old male patients with esophageal SCC, over 25 years. We used information on survival, cancer recurrence, and side effects from a Cochrane meta-analysis of two randomized trials. Societal utility values and costs of cancer care (2017, USD) were from the medical literature. To test robustness, we conducted deterministic (DSA) and probabilistic sensitivity analyses (PSA). RESULTS: In our base scenario, CRT resulted in less cost for more quality-adjusted life years (QALYs) compared to CRT+S ($154,082 for 1.32 QALYs/patient versus $165,035 for 1.30 QALYs/patient, respectively). In DSA, changes resulted in scenarios where CRT+S is cost-effective at thresholds between $100,000-$150,000/QALY. In PSA, CRT+S was dominant 17.9% and cost-effective at a willingness-to-pay of $150,000/QALY 38.9% of the time, and CRT was dominant 30.6% and cost-effective 61.1% of the time. This indicates that while CRT would be preferred most of the time, variation in parameters may change cost-effectiveness outcomes. CONCLUSIONS: Our results suggest that more data are needed regarding the clinical benefits of CRT+S for treatment of localized esophageal SCC, although CRT should be cautiously preferred.


Subject(s)
Chemoradiotherapy/economics , Cost-Benefit Analysis , Esophageal Neoplasms/economics , Esophageal Squamous Cell Carcinoma/economics , Esophagectomy/economics , Chemoradiotherapy/mortality , Combined Modality Therapy , Esophageal Neoplasms/pathology , Esophageal Neoplasms/therapy , Esophageal Squamous Cell Carcinoma/pathology , Esophageal Squamous Cell Carcinoma/therapy , Esophagectomy/mortality , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prognosis , Randomized Controlled Trials as Topic , Survival Rate
16.
Sci Rep ; 9(1): 18514, 2019 12 06.
Article in English | MEDLINE | ID: mdl-31811207

ABSTRACT

People living with HIV/AIDS (PLWHA) have a growing life expectancy in the US due to early provision of effective antiretroviral treatment. This has resulted in increasing exposure to age-related chronic illness that may be exacerbated by HIV/AIDS or antiretroviral treatment. Prior work has suggested that PLWHA may be subject to accelerated aging, with earlier onset and higher risk of acquiring many chronic illnesses. However, the magnitude of these effects, controlling for chronic co-morbidities, has not been fully quantified. We evaluate the magnitude of association of HIV infection on developing chronic conditions while controlling for demographics, behavioral risk factors, and chronic comorbidities. We compare chronic disease risks of diabetes, hypertension, stroke, cancers, lung diseases, cardiovascular diseases, and cognitive impairment between PLWHA and HIV- individuals in a large, de-identified private insurance claims dataset (~24,000 PLWHA) using logistic regressions. HIV status is statistically significantly associated with higher levels for all chronic illnesses examined, a result which is robust to multiple model specifications and duration of analysis (2, 5, and 10 years from enrollment). Our results suggest that PLWHA may be at elevated risk for a wide variety of chronic illnesses and may require additional care as the aging PLWHA population grows.


Subject(s)
Acquired Immunodeficiency Syndrome/complications , Chronic Disease , Comorbidity , HIV Infections/complications , Acquired Immunodeficiency Syndrome/epidemiology , Aged , Anti-HIV Agents/therapeutic use , Cardiovascular Diseases/complications , Databases, Factual , Dementia/complications , Diabetes Complications , Female , HIV Infections/epidemiology , Humans , Hypertension/complications , Insurance, Health , Lung Diseases/complications , Male , Middle Aged , Neoplasms/complications , Odds Ratio , Regression Analysis , Risk Factors , Stroke/complications , Treatment Outcome
17.
Med Decis Making ; 38(4): 452-464, 2018 05.
Article in English | MEDLINE | ID: mdl-29185378

ABSTRACT

BACKGROUND: Microsimulation models often compute the distribution of a simulated cohort's risk factors and medical outcomes over time using repeated waves of cross-sectional data. We sought to develop a strategy to simulate how risk factor values remain correlated over time within individuals, and compare it to available alternative methods. METHODS: We developed a method using shortest-distance matching for modeling changes in risk factors in individuals over time, which preserves both the cohort distribution of each risk factor as well as the cross-sectional correlation between risk factors observed in repeated cross-sectional data. We compared the performance of the method with rank stability and regression methods, using both synthetic data and data from the Framingham Offspring Heart Study (FOHS) to simulate a cohort's atherosclerotic cardiovascular disease (ASCVD) risk. RESULTS: The correlation between risk factors was better preserved using the shortest distance method than with rank stability or regression (root mean squared difference = 0.077 with shortest distance, v. 0.126 with rank stability and 0.146 with regression in FOHS, and 0.052, 0.426 and 0.352, respectively, in the synthetic data). The shortest distance method generated population ASCVD risk estimate distributions indistinguishable from the true distribution in over 99.8% of cases (Kolmogorov-Smirnov, P > 0.05), outperforming some existing regression methods, which produced ASCVD distributions statistically distinguishable from the true one at the 5% level around 15% of the time. LIMITATIONS: None of the methods considered could predict individual longitudinal trends without error. The shortest-distance method was not statistically inferior to rank stability or regression methods for predicting individual risk factor values over time in the FOHS. CONCLUSIONS: A shortest distance method may assist in preserving risk factor correlations in microsimulations informed by cross-sectional data.
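The shortest-distance idea above can be sketched as nearest-neighbor matching on risk-factor vectors: each simulated individual in wave t is linked to the closest record in wave t+1, preserving within-person correlation while each wave keeps its own marginal distribution. This sketch uses plain Euclidean distance; the actual method may standardize risk factors first:

```python
# Hedged sketch: shortest-distance matching of one individual's
# risk-factor vector to the closest record in the next survey wave.
# Euclidean distance on raw values is an illustrative simplification.
import math

def nearest_match(person, next_wave):
    """Index of the next-wave record closest to this person's vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(next_wave)), key=lambda i: dist(person, next_wave[i]))
```

Because each matched target is drawn from the observed next-wave distribution, the cohort-level marginals and cross-sectional correlations survive, which is the property the paper's comparisons against rank stability and regression measure.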


Subject(s)
Coronary Artery Disease/epidemiology , Cross-Sectional Studies/methods , Models, Statistical , Age Factors , Aged , Biomarkers , Blood Glucose , Body Mass Index , Computer Simulation , Female , Humans , Lipids/blood , Longitudinal Studies , Male , Middle Aged , Risk Factors , Smoking/epidemiology
18.
Health Care Manag Sci ; 21(4): 632-646, 2018 Dec.
Article in English | MEDLINE | ID: mdl-28861650

ABSTRACT

Effective treatment for tuberculosis (TB) patients on first-line treatment involves triaging those with drug-resistant (DR) TB to appropriate treatment alternatives. Patients likely to have DR TB are identified using results from repeated inexpensive sputum-smear (SS) tests and expensive but definitive drug sensitivity tests (DST). Early DST may lead to high costs and unnecessary testing; late DST may lead to poor health outcomes and disease transmission. We use a partially observable Markov decision process (POMDP) framework to determine optimal DST timing. We develop policy-relevant structural properties of the POMDP model. We apply our model to TB in India to identify the patterns of SS test results that should prompt DST if transmission costs remain at status-quo levels. Unlike previous analyses of personalized treatment policies, we take a societal perspective and consider the effects of disease transmission. The inclusion of such effects can significantly alter the optimal policy. We find that an optimal DST policy could save India approximately $1.9 billion annually.
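The core mechanic of a POMDP like the one above is a Bayesian belief update over the unobserved state (here, DR status) after each noisy observation (a sputum-smear result), with the definitive test triggered once the belief is high enough. The sketch below shows only that belief-update/threshold structure; the test characteristics and threshold are illustrative numbers, not values from the paper, and the optimal POMDP policy need not be a single fixed threshold.

```python
def update_belief(prior_dr, smear_positive,
                  p_pos_given_dr=0.7, p_pos_given_ds=0.3):
    """Bayes update of P(drug-resistant TB) after one sputum-smear result.
    Test characteristics are illustrative, not from the paper."""
    if smear_positive:
        num = p_pos_given_dr * prior_dr
        den = num + p_pos_given_ds * (1 - prior_dr)
    else:
        num = (1 - p_pos_given_dr) * prior_dr
        den = num + (1 - p_pos_given_ds) * (1 - prior_dr)
    return num / den

def order_dst(belief, threshold=0.4):
    """Threshold-style policy: order the definitive (expensive) DST
    once the belief in DR TB is sufficiently high."""
    return belief >= threshold
```

Under these illustrative numbers, one positive smear is not enough to prompt DST, but a run of positives pushes the belief over the threshold, mirroring how "patterns of SS test results" drive the timing decision.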


Subject(s)
Antitubercular Agents/therapeutic use , Tuberculosis, Multidrug-Resistant/diagnosis , Tuberculosis, Multidrug-Resistant/drug therapy , Age Factors , Algorithms , Antitubercular Agents/administration & dosage , Communicable Disease Control/organization & administration , Cost of Illness , Health Policy , Humans , India , Markov Chains , Sex Factors , Sputum/microbiology , Time Factors , Tuberculosis/diagnosis , Tuberculosis/drug therapy
19.
J Theor Biol ; 428: 1-17, 2017 09 07.
Article in English | MEDLINE | ID: mdl-28606751

ABSTRACT

Economic evaluations of infectious disease control interventions frequently use dynamic compartmental epidemic models. Such models capture heterogeneity in risk of infection by stratifying the population into discrete risk groups, thus approximating what is typically continuous variation in risk. An important open question is whether and how different risk stratification choices influence model predictions of intervention effects. We develop equivalent Susceptible-Infected-Susceptible (SIS) dynamic transmission models: an unstratified model, a model stratified into a high-risk and low-risk group, and a model with an arbitrary number of risk groups. Absent intervention, the models produce the same overall prevalence of infected individuals in steady state. We consider an intervention that either reduces the contact rate or increases the disease clearance rate. We develop analytical and numerical results characterizing the models and the effects of the intervention. We find that there exist multiple feasible choices of risk stratification, contact distribution, and within- and between-group contact rates for models that stratify risk. We show analytically and empirically that these choices can generate different estimates of intervention effectiveness, and that these differences can be significant enough to alter conclusions from cost-effectiveness analyses and change policy recommendations. We conclude that the choice of how to discretize risk in compartmental epidemic models can influence predicted effectiveness of interventions. Therefore, analysts should examine multiple alternatives and report the range of results.
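A stratified SIS model of the kind discussed above can be sketched in a few lines: each group's force of infection depends on its contact rate and on how its contacts are distributed across groups, and changing the stratification or mixing matrix changes the projected intervention effect. All parameter values and the mixing structure below are illustrative assumptions, not the paper's calibrated models.

```python
import numpy as np

def sis_groups(beta, gamma, c, mix, I0, dt=0.05, steps=20000):
    """Forward-Euler integration of a group-stratified SIS model (sketch).

    beta:  transmission probability per contact
    gamma: clearance rate
    c:     per-group contact rates, shape (g,)
    mix:   (g, g) matrix; mix[i, j] = share of group i's contacts with group j
    I0:    initial prevalence in each group
    """
    I = np.array(I0, dtype=float)
    for _ in range(steps):
        force = beta * c * (mix @ I)          # per-group force of infection
        I += dt * (force * (1.0 - I) - gamma * I)
    return I
```

With a single group this reduces to the classic SIS steady state I* = 1 - gamma/(beta*c); with two groups, the same overall prevalence can be produced by many (c, mix) choices, which is exactly the degree of freedom the abstract shows can alter estimated intervention effectiveness.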


Subject(s)
Epidemics , Models, Biological , Risk Assessment , Communicable Disease Control , Gonorrhea/epidemiology , Gonorrhea/transmission , Homosexuality, Male/statistics & numerical data , Humans , Male , Numerical Analysis, Computer-Assisted , Prevalence , Risk Factors
20.
Lancet Glob Health ; 4(11): e806-e815, 2016 11.
Article in English | MEDLINE | ID: mdl-27720688

ABSTRACT

BACKGROUND: The post-2015 End TB Strategy proposes targets of 50% reduction in tuberculosis incidence and 75% reduction in mortality from tuberculosis by 2025. We aimed to assess whether these targets are feasible in three high-burden countries with contrasting epidemiology and previous programmatic achievements. METHODS: 11 independently developed mathematical models of tuberculosis transmission projected the epidemiological impact of currently available tuberculosis interventions for prevention, diagnosis, and treatment in China, India, and South Africa. Models were calibrated with data on tuberculosis incidence and mortality in 2012. Representatives from national tuberculosis programmes and the advocacy community provided distinct country-specific intervention scenarios, which included screening for symptoms, active case finding, and preventive therapy. FINDINGS: Aggressive scale-up of any single intervention scenario could not achieve the post-2015 End TB Strategy targets in any country. However, the models projected that, in the South Africa national tuberculosis programme scenario, a combination of continuous isoniazid preventive therapy for individuals on antiretroviral therapy, expanded facility-based screening for symptoms of tuberculosis at health centres, and improved tuberculosis care could achieve a 55% reduction in incidence (range 31-62%) and a 72% reduction in mortality (range 64-82%) compared with 2015 levels. For India, and particularly for China, full scale-up of all interventions in the tuberculosis-programme performance scenarios fell short of the 2025 targets, despite preventing a cumulative 3·4 million cases. The advocacy scenarios illustrated the high impact of detecting and treating latent tuberculosis. INTERPRETATION: Major reductions in tuberculosis burden seem possible with current interventions. However, additional interventions, adapted to country-specific tuberculosis epidemiology and health systems, are needed to reach the post-2015 End TB Strategy targets at country level. FUNDING: Bill and Melinda Gates Foundation.


Subject(s)
Achievement , Delivery of Health Care , Goals , Tuberculosis/prevention & control , Antitubercular Agents/therapeutic use , Cause of Death , China , Forecasting , HIV Infections/complications , Health Services Accessibility , Humans , Incidence , India , Isoniazid/therapeutic use , Mass Screening , Models, Theoretical , South Africa , Tuberculosis/epidemiology , Tuberculosis/therapy , Tuberculosis/transmission , World Health Organization